A Globally Convergent Conjugate Gradient Method for Minimizing Self-Concordant Functions on Riemannian Manifolds

Authors

  • Huibo Ji
  • Jonathan H. Manton
  • John B. Moore
Abstract

Self-concordant functions are a special class of convex functions on Euclidean space introduced by Nesterov. They are used in interior point methods, based on Newton iterations, where they play an important role in efficiently solving certain constrained optimization problems. The concept of self-concordant functions was extended to Riemannian manifolds by Jiang et al., who developed a damped Newton method for this setting. As a further development, this paper proposes a damped conjugate gradient method: an ordinary conjugate gradient method equipped with a novel step-size selection rule that is proved to ensure convergence to the global minimum. The advantage of the damped conjugate gradient algorithm over the damped Newton method is its lower computational complexity. To illustrate these advantages, the algorithm is applied to find the center of mass of given points on a hyperboloid model, known as the Karcher mean.
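The paper's damped step-size rule is not reproduced in this listing, but the target application can be sketched. Below is a minimal Karcher-mean iteration on the hyperboloid (Lorentz) model, using a plain fixed-step Riemannian gradient update in place of the paper's damped conjugate gradient; all function names are illustrative, not the authors' code.

```python
import numpy as np

def minkowski(u, v):
    # Lorentzian inner product <u, v> = -u0*v0 + u1*v1 + ... + un*vn
    return -u[0] * v[0] + u[1:] @ v[1:]

def exp_map(x, v):
    # Exponential map on the hyperboloid {x : <x, x> = -1, x0 > 0}
    n = np.sqrt(max(minkowski(v, v), 0.0))
    if n < 1e-12:
        return x
    return np.cosh(n) * x + np.sinh(n) * v / n

def log_map(x, y):
    # Inverse of exp_map: tangent vector at x pointing toward y
    ip = minkowski(x, y)
    d = np.arccosh(max(-ip, 1.0))   # geodesic distance
    if d < 1e-12:
        return np.zeros_like(x)
    w = y + ip * x                  # component of y tangent at x
    return d * w / np.sqrt(minkowski(w, w))

def karcher_mean(points, iters=100, tol=1e-10):
    # Fixed-step Riemannian gradient iteration for the center of mass:
    # x <- exp_x( average of log_x(p_i) )
    x = points[0].copy()
    for _ in range(iters):
        g = sum(log_map(x, p) for p in points) / len(points)
        if minkowski(g, g) < tol ** 2:
            break
        x = exp_map(x, g)
    return x
```

Each iterate stays on the hyperboloid by construction; the paper's contribution is to replace the naive fixed step with a damped step-size rule whose global convergence is guaranteed for self-concordant cost functions.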


Similar articles

A class of self-concordant functions on Riemannian manifolds

The notion of self-concordant function on Euclidean spaces was introduced and studied by Nesterov and Nemirovsky [6], who used these functions to design numerical optimization algorithms based on interior-point methods ([7]). In [12], Constantin Udrişte extends this study to the Riemannian context of optimization methods. In this paper, we use a decomposable function to intr...


A Geometry Preserving Kernel over Riemannian Manifolds

The kernel trick and projection to tangent spaces are two choices for linearizing data points lying on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...


Geometric Optimization Methods for Adaptive Filtering

The techniques and analysis presented in this thesis provide new methods to solve optimization problems posed on Riemannian manifolds. These methods are applied to the subspace tracking problem found in adaptive signal processing and adaptive control. A new point of view is offered for the constrained optimization problem. Some classical optimization techniques on Euclidean space are generalize...


Trust-region methods on Riemannian manifolds with applications in numerical linear algebra

A general scheme for trust-region methods on Riemannian manifolds is proposed. A truncated conjugate-gradient method is utilized to solve the trust-region subproblems. The method is illustrated on several problems of numerical linear algebra.
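In the Euclidean case, the truncated conjugate-gradient inner solver referred to here is commonly the Steihaug–Toint method; on a manifold the same iteration runs in the tangent space at the current iterate. A minimal Euclidean sketch, assuming a dense symmetric matrix and illustrative names, is:

```python
import numpy as np

def steihaug_cg(B, g, delta, tol=1e-8, max_iter=100):
    # Truncated CG for the trust-region subproblem:
    #   minimize g^T p + 0.5 p^T B p   subject to ||p|| <= delta
    p = np.zeros_like(g)
    r = g.copy()              # residual = B p + g
    d = -r                    # initial search direction
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter):
        Bd = B @ d
        dBd = d @ Bd
        if dBd <= 0:
            # negative curvature: follow d to the trust-region boundary
            return p + _boundary_step(p, d, delta)
        alpha = (r @ r) / dBd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            # step leaves the trust region: truncate at the boundary
            return p + _boundary_step(p, d, delta)
        r_next = r + alpha * Bd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        p, r = p_next, r_next
        d = -r + beta * d
    return p

def _boundary_step(p, d, delta):
    # smallest tau >= 0 with ||p + tau*d|| = delta
    a, b, c = d @ d, 2 * (p @ d), p @ p - delta ** 2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return tau * d
```

The two truncation rules (negative curvature, boundary crossing) are what make the inner solve cheap: the method never needs to solve the subproblem exactly to retain global convergence of the outer trust-region loop.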


An extension of a three-term conjugate gradient method based on objective function values with guaranteed convergence without a convexity assumption

Given the importance of conjugate gradient methods for large-scale optimization, this study proposes a descent three-term conjugate gradient method based on an extended modified secant condition. In the proposed method, objective function values are used in addition to the gradient information. It is also established that the method is globally convergent without convexity assu...




Journal:

Volume   Issue 

Pages  -

Publication date: 2008